Modeling emergency departments using discrete event simulation techniques
This paper discusses the application of Discrete Event Simulation (DES) for modeling the operations of an Emergency Department (ED). The model was developed to help the ED managers understand the behavior of the system with regard to the hidden causes of excessive waiting times. It served as a tool for assessing the impact of major departmental resources on Key Performance Indicators (KPIs), and was also used as a cost-effective method for testing various what-if scenarios for possible system improvement. The study greatly enhanced managers' understanding of the system and how patient flow is influenced by process changes and resource availability. The results of this work also helped managers to either reverse or modify some proposed changes to the system that were previously being considered. The results also show a possible reduction of more than 20% in patient waiting times.
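The queueing mechanism such an ED model rests on can be sketched in a few lines. This is a minimal single-resource simulation with exponential arrivals and service times; the rates, patient count and one-server simplification are all illustrative assumptions, not the paper's validated ED model:

```python
import random

def simulate_ed(n_patients=1000, arrival_rate=0.5, service_rate=0.6, seed=42):
    """Mean waiting time for a single-resource ED queue (M/M/1-style sketch)."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(arrival_rate)   # exponential inter-arrival gaps
        arrivals.append(t)
    server_free = 0.0
    waits = []
    for a in arrivals:
        start = max(a, server_free)          # patient waits until the resource frees up
        waits.append(start - a)
        server_free = start + rng.expovariate(service_rate)
    return sum(waits) / len(waits)

print(round(simulate_ed(), 2))
```

Re-running with a higher `service_rate` is the toy analogue of the what-if resource scenarios the paper tests at departmental scale.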
Service attribute importance and strategic planning: An empirical study
There is growing evidence that attribute importance is a function of attribute performance. Several studies have reported that service quality attributes fall into three categories: basic, performance, and excitement. The identification of attribute importance is therefore a key step in evaluating customer satisfaction and other behavioural intentions. According to the customer behaviour literature, attribute importance can be measured in two ways: (1) self-stated importance, and (2) statistically inferred importance. The article evaluates the two methods according to their impact on overall customer satisfaction measurement and managerial implementation. A case study of the telecommunications industry is used for the analysis.
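The two measurement routes named above can be contrasted concretely. In this sketch, statistically inferred importance is approximated by each attribute's correlation with overall satisfaction; the attribute names and survey scores are fabricated for illustration and do not come from the study:

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def inferred_importance(attribute_scores, overall):
    """Statistically inferred importance: correlation with overall satisfaction."""
    return {name: round(pearson(scores, overall), 2)
            for name, scores in attribute_scores.items()}

overall = [7, 5, 8, 4, 9, 6]                 # overall satisfaction ratings
scores = {
    "network_quality": [7, 4, 8, 3, 9, 5],   # tracks overall closely
    "billing_clarity": [5, 6, 5, 6, 5, 6],   # barely varies with overall
}
print(inferred_importance(scores, overall))
```

Self-stated importance, by contrast, would simply be a second survey column of direct ratings; the paper's point is that the two rankings need not agree.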
A computer-based product classification and component detection for demanufacturing processes
This is an Author's Accepted Manuscript of an article published in International Journal of Computer Integrated Manufacturing, 24(10), 900-914, 2011 [copyright Taylor & Francis], available online at: http://www.tandfonline.com/10.1080/0951192X.2011.579169.
The aim of this paper is to propose a novel computer-based product classification, component detection and tracking approach for demanufacturing and disassembly processes. This is achieved by introducing a series of automated and sequential steps (product scanning, component identification, image analysis and sorting), leading to the development of a bill of material (BOM). The produced BOM can then be associated with the relevant disassembly/demanufacture proviso. The proposed integrated image sorting and product classification (ISPC) approach can be considered a step forward in the automation of demanufacturing activities. The ISPC model proposed in this paper utilises and builds on state-of-the-art technology and the current body of research in computer-integrated demanufacturing and remanufacturing (CIDR). An appraisal of the latest research material and the factors that inhibit CIDR methods in practice is presented. A novel solution for the integration of imaging and material identification techniques to overcome some of the existing shortcomings of automated recycling processes is proposed in this paper. The proposed product scanning and component detection ISPC software consists of four distinct models: the repertory database, the search engine, the product-attributes updater, and the image sorting and classification algorithm. The software framework that integrates the four components is presented in this paper. Finally, an overall assessment of applying ISPC at various stages of CIDR processes concludes the article.
University of Ibadan MacArthur Foundation Grant
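The scan-to-BOM step described above can be sketched as a small pipeline. The image-analysis stage is stubbed out, and the component labels, field names and `detect_components` helper are illustrative assumptions rather than the paper's ISPC implementation:

```python
from collections import Counter

def detect_components(scan):
    """Stub for the image-analysis stage: one label per detected region."""
    return [region["label"] for region in scan["regions"]]

def build_bom(scan):
    """Aggregate detected component labels into a bill of material."""
    return dict(Counter(detect_components(scan)))

# Hypothetical scan result for a disassembly candidate
scan = {"product": "desktop PC", "regions": [
    {"label": "PCB"}, {"label": "screw"}, {"label": "screw"},
    {"label": "heat_sink"}, {"label": "screw"},
]}
print(build_bom(scan))   # {'PCB': 1, 'screw': 3, 'heat_sink': 1}
```

In the paper's framing, the resulting BOM is then matched against stored disassembly instructions; here it is just a label count.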
A review of historical developments of quality assessment in industry and healthcare
Purpose: This study reviewed the literature on the historical development of quality assessment methods in industry and in healthcare. A comparative analysis of quality methods in industry and healthcare was conducted to examine the gap between methods in the two sectors. An attempt was then made to examine the latest approaches to quality assessment in healthcare, and finally a proposal is offered for a more effective approach to tackling the problem of quality in healthcare.
Design/methodology/approach:
Firstly, a review of the evolution of quality assessment in industry and healthcare was conducted. This was based on books written by prominent experts in the field of quality. Secondly, a study of the current approaches in healthcare was undertaken. Publications from varied sources were selected and reviewed. The literature consulted includes worldwide operations research and healthcare sources, including dissertations, the internet and the reference lists of relevant articles.
The journal papers and conference proceedings were selected according to the following criteria. Objective: the study must be aimed at measuring or improving quality, or both; it could also be aimed at developing new ways of measuring the quality of healthcare. Method: observational studies, experimental trials or systematic reviews. Setting: the study should be in a hospital setting and not narrowed to the quality of clinical care.
Findings: This study showed that the concept of quality management and its control in healthcare is not as advanced as it is in industry. Moreover, it seemed that most researchers who set out to assess quality of care in one way or another have held differing views of quality and of the factors that contribute to its assessment. It was also deduced that the way forward in healthcare quality is the development of systems that give staff ownership and pride in a way that is akin to the era of the craftsmen.
Flexible data input layer architecture (FDILA) for quick-response decision making tools in volatile manufacturing systems
This paper proposes the foundation for a flexible data input management system as a vital part of a generic solution for quick-response decision making. The lack of a comprehensive data input layer between data acquisition and processing systems has been identified as the motivating gap. The proposed FDILA is applicable to a wide variety of volatile manufacturing environments. It provides a generic platform that enables system designers to define any number of data entry points and types, regardless of their make and specifications, in a standard fashion. This is achieved by providing a variable definition layer immediately on top of the data acquisition layer and before the data pre-processing layer. For proof of concept, National Instruments' LabVIEW data acquisition software is used to simulate a typical shop floor data acquisition system. The extracted data can then be fed into a data mining module that builds cost modeling functions involving the plant's Key Performance Factors.
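The variable definition layer described above can be sketched as a small registry that sits between acquisition and pre-processing. The class name, descriptor fields and the `spindle_speed` example are assumptions for illustration, not the paper's actual interface:

```python
class FDILA:
    """Sketch of a flexible data input layer: entry points of any make are
    registered under one standard descriptor before pre-processing."""

    def __init__(self):
        self._points = {}

    def define_point(self, name, unit, source, dtype=float):
        """Register a data entry point regardless of vendor or protocol."""
        self._points[name] = {"unit": unit, "source": source, "dtype": dtype}

    def ingest(self, name, raw_value):
        """Coerce a raw reading into the declared type for pre-processing."""
        spec = self._points[name]
        return {"name": name, "value": spec["dtype"](raw_value),
                "unit": spec["unit"], "source": spec["source"]}

layer = FDILA()
layer.define_point("spindle_speed", "rpm", source="PLC-3")
print(layer.ingest("spindle_speed", "1480"))
```

The point of the design is that downstream modules (such as the cost-modeling data miner) only ever see the standardized record, never the vendor-specific raw feed.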
Using discrete event simulation (DES) to manage theatre operations in healthcare: An audit-based case study
This paper discusses the application of Discrete Event Simulation (DES) in modelling the complex relationship between patient types, case-mix and operating theatre allocation in a large National Health Service (NHS) Trust in London. The simulation model that was constructed described the main features of nine theatres, focusing on operational processes and patient throughput times. The model was used to test three scenarios of case-mix and to demonstrate the potential of simulation modelling as a cost-effective method for understanding the issues of healthcare operations management and the role of simulation techniques in problem solving. The results indicated that removing all day cases would reduce patient throughput by 23.3% and the utilization of the orthopaedic theatre in particular by 6.5%. This represents a case example of how DES can be used by healthcare managers to inform decision making.
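The scenario logic behind the day-case result can be illustrated with a toy calculation. The case counts below are invented (chosen so that removing day cases reproduces the 23.3% headline reduction) and stand in for the paper's full nine-theatre simulation:

```python
def throughput(case_mix):
    """Total patients handled under a given case-mix (toy model)."""
    return sum(case_mix.values())

# Hypothetical weekly case counts, not the Trust's real figures
baseline = {"day_cases": 70, "inpatient_elective": 180, "emergency": 50}
scenario = {**baseline, "day_cases": 0}     # scenario: remove all day cases

drop = 1 - throughput(scenario) / throughput(baseline)
print(f"throughput reduction: {drop:.1%}")
```

A DES model replaces this arithmetic with queueing dynamics, so the scenario impact on theatre utilization emerges rather than being assumed.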
Scheduling of non-repetitive lean manufacturing systems under uncertainty using intelligent agent simulation
World-class manufacturing paradigms emerge from specific types of manufacturing systems with which they remain associated until they become obsolete. Since its introduction, the lean paradigm has been implemented almost exclusively in repetitive manufacturing systems employing flow-shop layout configurations. Due to its inherent complexity and combinatorial nature, scheduling is one application domain in which the implementation of manufacturing philosophies and best practices is particularly challenging. The study of the limited reported attempts to extend leanness into the scheduling of non-repetitive manufacturing systems with functional shop-floor configurations confirms that these works have adopted a similar approach, which aims to transform the system mainly through reconfiguration in order to increase the degree of manufacturing repetitiveness and thus facilitate the adoption of leanness. This research proposes the use of leading-edge intelligent agent simulation to extend lean principles and techniques to the scheduling of non-repetitive production environments with functional layouts and no prior reconfiguration of any form. The simulated system is a dynamic job-shop with stochastic order arrivals and processing times operating under a variety of dispatching rules. The modelled job-shop is subject to uncertainty expressed in the form of high-priority orders unexpectedly arriving at the system, order cancellations, and machine breakdowns. The effect of the various forms of stochastic disruption considered in this study on system performance before and after the introduction of leanness is analysed in terms of a number of time, due date and work-in-progress related performance metrics.
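The dispatching-rule comparison at the heart of such a study can be sketched in miniature. This sketch collapses the job-shop to a single machine for brevity (the paper models a full dynamic job-shop with breakdowns and cancellations); job data, rules and the tardiness metric are illustrative:

```python
import random

def simulate(rule, n_jobs=200, seed=3):
    """Mean tardiness on one machine under a given dispatching rule."""
    rng = random.Random(seed)
    jobs = [{"proc": rng.expovariate(1.0),        # stochastic processing time
             "due": rng.uniform(0, n_jobs)}       # stochastic due date
            for _ in range(n_jobs)]
    if rule == "SPT":                             # shortest processing time first
        jobs.sort(key=lambda j: j["proc"])
    elif rule == "EDD":                           # earliest due date first
        jobs.sort(key=lambda j: j["due"])
    # "FIFO" keeps arrival order
    t, tardiness = 0.0, 0.0
    for j in jobs:
        t += j["proc"]
        tardiness += max(0.0, t - j["due"])
    return tardiness / n_jobs

for rule in ("FIFO", "SPT", "EDD"):
    print(rule, round(simulate(rule), 2))
```

The same seed is reused across rules so that each rule is evaluated on an identical job set, mirroring the common-random-numbers practice in simulation experiments.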
Event tracking for real-time unaware sensitivity analysis (EventTracker)
This is the author's accepted manuscript. The final published article is available from the link below. Copyright @ 2013 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other users, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.
This paper introduces a platform for online Sensitivity Analysis (SA) that is applicable in large-scale real-time data acquisition (DAQ) systems. Here we use the term real-time in the context of a system that has to respond to externally generated input stimuli within a finite and specified period. Complex industrial systems such as manufacturing, healthcare, transport, and finance require high-quality information on which to base timely responses to events occurring in their volatile environments. The motivation for the proposed EventTracker platform is the assumption that modern industrial systems are able to capture data in real-time and have the necessary technological flexibility to adjust to changing system requirements. The flexibility to adapt can only be assured if data is succinctly interpreted and translated into corrective actions in a timely manner. An important factor that facilitates data interpretation and information modelling is an appreciation of the effect system inputs have on each output at the time of occurrence. Many existing sensitivity analysis methods appear to hamper efficient and timely analysis due to a reliance on historical data, or sluggishness in providing a timely solution that would be of use in real-time applications. This inefficiency is further compounded by computational limitations and the complexity of some existing models.
In dealing with real-time event-driven systems, the underpinning logic of the proposed method is based on the assumption that in the vast majority of cases changes in input variables will trigger events. Every single event, or combination of events, could subsequently result in a change to the system state. The proposed event tracking sensitivity analysis method describes variables and the system state as a collection of events. The higher the numeric occurrence of an input variable at the trigger level during an event monitoring interval, the greater is its impact on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis method with a comparable method (that of Entropy). An improvement of 10% in computational efficiency without loss in accuracy was observed. The comparison also showed that the time taken to perform the sensitivity analysis was 0.5% of that required when using the comparable Entropy-based method.
EPSRC
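The counting rule stated above (more trigger-level occurrences in a monitoring interval implies greater impact) lends itself to a short sketch. The variable names, thresholds and sample stream are invented, and the real EventTracker platform operates on live DAQ feeds rather than a list:

```python
from collections import Counter

def rank_sensitivity(stream, thresholds):
    """Rank input variables by how often they crossed their trigger level
    within one event monitoring interval (stream = list of samples)."""
    triggers = Counter()
    for sample in stream:
        for var, value in sample.items():
            if value >= thresholds[var]:     # trigger level crossed: one event
                triggers[var] += 1
    return [var for var, _ in triggers.most_common()]

# Hypothetical samples from one monitoring interval
stream = [{"temp": 81, "vibration": 2}, {"temp": 85, "vibration": 9},
          {"temp": 79, "vibration": 8}, {"temp": 90, "vibration": 3}]
thresholds = {"temp": 80, "vibration": 7}
print(rank_sensitivity(stream, thresholds))   # ['temp', 'vibration']
```

Because the ranking needs only running counts, it can be maintained incrementally as samples arrive, which is the property that makes the approach viable in real time compared with history-based SA methods.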
Estimation of sexual behavior in 18-to-24-year-old Iranian youth based on a crosswise model study
Background: In many countries, negative social attitudes towards sensitive issues such as sexual behavior have resulted in false and invalid data concerning this issue. This is an analytical cross-sectional study in which a total of 1500 single students from universities of Shahroud City were sampled using a multi-stage technique. The students were assured that the information they disclosed to the researcher would be treated as private and confidential. The results were analyzed using the crosswise model, crosswise regression, t-tests and chi-square tests. Findings: It seems that the prevalence of sexual behavior among Iranian youth is 41% (CI = 36-53). Conclusion: The findings showed that the estimated prevalence of sexual relationships among single Iranian youth is high. Thus, devising training models according to the Islamic-Iranian culture is necessary in order to prevent risky sexual behavior. © 2014 Vakilian et al.; licensee BioMed Central Ltd.
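The crosswise model works by asking respondents only whether their truthful answers to a sensitive question and an unrelated nonsensitive question match, so no individual ever reveals the sensitive answer. The standard point estimate recovers the sensitive prevalence from the matching rate; the numbers below are illustrative, not the study's data:

```python
def crosswise_estimate(lam_hat, p):
    """Crosswise-model prevalence estimate.

    lam_hat: observed proportion giving a "matching" answer
             (both yes, or both no)
    p:       known probability of "yes" on the nonsensitive question
             (must differ from 0.5, or the model is uninformative)
    """
    if p == 0.5:
        raise ValueError("p = 0.5 makes the sensitive trait unidentifiable")
    return (lam_hat + p - 1) / (2 * p - 1)

# e.g. 62% matching answers with a nonsensitive "yes" probability of 0.25
print(round(crosswise_estimate(0.62, 0.25), 2))   # 0.26
```

The derivation is one line: matching occurs with probability lam = pi*p + (1 - pi)*(1 - p), which solves to pi = (lam + p - 1) / (2p - 1).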